
# 128K Long-Context Inference

## Llama-3.3-Nemotron-Super-49B-v1-GGUF (Other)
Llama-3.3-Nemotron-Super-49B-v1 is a large language model derived from Meta Llama-3.3-70B-Instruct, post-trained to improve reasoning, human chat preference alignment, and task execution, and it supports a context length of 128K tokens.
Large Language Model · Transformers · English
unsloth · 814 · 1
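These checkpoints are distributed as GGUF files, so a common way to run them with the full 128K-token window is through llama.cpp bindings. Below is a minimal sketch, assuming the llama-cpp-python package and a locally downloaded quantization; the file name and prompts are illustrative, not part of the listing.

```python
# Minimal sketch: serving a 128K-context GGUF build with llama-cpp-python.
# The model_path below is a hypothetical local file; substitute whichever
# quantization of Llama-3.3-Nemotron-Super-49B-v1-GGUF you downloaded.
from llama_cpp import Llama

llm = Llama(
    model_path="./Llama-3.3-Nemotron-Super-49B-v1-Q4_K_M.gguf",  # hypothetical path
    n_ctx=131072,      # request the full 128K-token context window
    n_gpu_layers=-1,   # offload all layers to GPU if memory allows
)

out = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Summarize the attached report."}],
    max_tokens=512,
)
print(out["choices"][0]["message"]["content"])
```

Note that a 131,072-token KV cache at this model size is memory-hungry; in practice the context is often capped lower on consumer hardware.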
## Llama-3.1-Nemotron-Nano-8B-v1-GGUF (Other)
Llama-3.1-Nemotron-Nano-8B-v1 is a reasoning model based on Meta Llama-3.1-8B-Instruct, post-trained to improve reasoning capabilities, human chat preference alignment, and task execution.
Large Language Model · Transformers · English
unsloth · 22.18k · 3
## GLM-4-9B-Chat-HF (Other)
GLM-4-9B is the latest open-source release in the GLM-4 series of pre-trained models developed by Zhipu AI, with strong capabilities in semantics, mathematics, reasoning, coding, and knowledge.
Large Language Model · Transformers · Multilingual
THUDM · 7,919 · 13
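Since this release targets the Transformers library, it can be loaded directly with Hugging Face Transformers. A minimal sketch, assuming a recent transformers version with native GLM-4 support and the THUDM/glm-4-9b-chat-hf repository id; the prompt text is illustrative.

```python
# Minimal sketch: chatting with glm-4-9b-chat-hf via Hugging Face Transformers.
# Assumes a recent transformers release with GLM-4 support and enough GPU memory.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "THUDM/glm-4-9b-chat-hf"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)

messages = [{"role": "user", "content": "Explain long-context attention in two sentences."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(inputs, max_new_tokens=256)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True))
```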